Multifidelity Covariance Estimation via Regression on the Manifold of Symmetric Positive Definite Matrices
We introduce a multifidelity estimator of covariance matrices formulated as
the solution to a regression problem on the manifold of symmetric positive
definite matrices. The estimator is positive definite by construction, and the
Mahalanobis distance minimized to obtain it possesses properties which enable
practical computation. We show that our manifold regression multifidelity
(MRMF) covariance estimator is a maximum likelihood estimator under a certain
error model on the manifold's tangent space. More broadly, we show that our
Riemannian regression framework encompasses existing multifidelity covariance
estimators constructed from control variates. We demonstrate via numerical
examples that our estimator can provide significant decreases, up to one order
of magnitude, in squared estimation error relative to both single-fidelity and
other multifidelity covariance estimators. Furthermore, preservation of
positive definiteness ensures that our estimator is compatible with downstream
tasks, such as data assimilation and metric learning, in which this property is
essential.
Comment: 30 pages + 15-page supplement
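The contrast drawn above with control-variate constructions can be illustrated with a small sketch (all matrices and the weight below are hypothetical, chosen only for illustration): a naive linear control-variate combination of covariance estimates need not remain positive definite, which is the failure mode the manifold formulation avoids by construction.

```python
import numpy as np

# Toy covariance estimates (hypothetical values for illustration).
C_hi = np.array([[1.0, 0.0],
                 [0.0, 1.0]])        # high-fidelity sample covariance
C_lo_shared = np.array([[2.0, 0.0],
                        [0.0, 2.0]]) # low-fidelity cov., shared inputs
C_lo_extra = np.array([[0.5, 0.0],
                       [0.0, 0.5]])  # low-fidelity cov., extra samples

beta = 1.0  # control-variate weight (illustrative, not optimized)

# Linear control-variate combination: each term is SPD, but the
# combination is diag(-0.5, -0.5), which is negative definite.
naive = C_hi + beta * (C_lo_extra - C_lo_shared)
eigs = np.linalg.eigvalsh(naive)  # both eigenvalues are negative here
```

Regression on the SPD manifold sidesteps this because the estimate is mapped back to the manifold at the end, so it is positive definite regardless of the weights.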
Multi-Fidelity Covariance Estimation in the Log-Euclidean Geometry
We introduce a multi-fidelity estimator of covariance matrices that employs
the log-Euclidean geometry of the symmetric positive-definite manifold. The
estimator fuses samples from a hierarchy of data sources of differing
fidelities and costs for variance reduction while guaranteeing definiteness, in
contrast with previous approaches. The new estimator makes covariance
estimation tractable in applications where simulation or data collection is
expensive; to that end, we develop an optimal sample allocation scheme that
minimizes the mean-squared error of the estimator given a fixed budget.
Guaranteed definiteness is crucial to metric learning, data assimilation, and
other downstream tasks. Evaluations of our approach using data from physical
applications (heat conduction, fluid dynamics) demonstrate more accurate metric
learning and speedups of more than one order of magnitude compared to
benchmarks.
Comment: To appear at the International Conference on Machine Learning (ICML) 202
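The fusion idea described in this abstract can be sketched as follows. This is a minimal illustration, not the paper's estimator: the function name, the fixed control-variate weight `beta`, and the two-fidelity setup are assumptions; the paper additionally optimizes the weights and the sample allocation to minimize mean-squared error.

```python
import numpy as np
from scipy.linalg import logm, expm

def sample_cov(X):
    """Unbiased sample covariance of the rows of X (n_samples x d)."""
    return np.cov(X, rowvar=False)

def log_euclidean_fuse(C_hi, C_lo_shared, C_lo_extra, beta):
    """Control-variate-style fusion in the log-Euclidean tangent space.

    C_hi:        high-fidelity sample covariance (few, expensive samples)
    C_lo_shared: low-fidelity covariance from the same inputs as C_hi
    C_lo_extra:  low-fidelity covariance from many cheap extra samples
    beta:        control-variate weight (illustrative; chosen, not optimal)
    """
    # Combine in the tangent space via the matrix logarithm ...
    L = logm(C_hi) + beta * (logm(C_lo_extra) - logm(C_lo_shared))
    L = 0.5 * (L + L.T)  # symmetrize against floating-point round-off
    # ... and map back: expm of a symmetric matrix is always SPD,
    # so definiteness is guaranteed by construction.
    return expm(L)
```

Because the linear combination happens in the (flat) tangent space and the result is pushed back through the matrix exponential, the fused estimate is symmetric positive definite no matter how the low-fidelity corrections pull on it.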